
    Explainable Persuasion for Persuasive Interfaces: The Case of Online Gambling

    As human attention is a scarce resource, interactive online platforms such as social networks, gaming and online gambling platforms utilise persuasive interfaces to maximise user engagement. However, ethical concerns may arise since persuasive systems influence user behaviours. While interacting with persuasive systems, users may be unaware of being persuaded or of the negative consequences that may result from the interaction. This can hinder users’ ability to evaluate the persuasion attempt and regulate their behaviour. Moreover, persuasive systems designed to maximise user engagement may, in some cases, trigger or reinforce addictive usage. There is evidence in the literature that online persuasive interfaces may influence psychological and cognitive mechanisms related to addictive behaviour. Transparency and user voluntariness are proposed as the building blocks of ethical persuasive systems. To date, however, the concept of transparent persuasive technology has remained largely philosophical in academia. One approach to designing persuasive systems that meet the transparency and user voluntariness requirements could be to fulfil the conditions for informed consent. When interacting with persuasive systems, users could be informed about the persuasive design techniques used by the system, and such information may help them build resilience against the system’s persuasion attempts. This approach aligns with the principles outlined in the software engineering code of ethics: avoiding harm and maintaining honesty and trustworthiness. This thesis introduces and evaluates the concept of explainable persuasion in the context of designing ethical digital persuasive interfaces, drawing an analogy to explainable artificial intelligence. A mixed methods approach was used to achieve this goal.
The thesis focused on a distinct domain, online gambling, as gambling disorder is recognised as a mental disorder by health organisations. Accordingly, a scoping review was first conducted to identify the main persuasive design techniques utilised in online gambling platforms. The identified persuasive design techniques were analysed, through the addiction literature, for their potential to facilitate gambling disorder. An online survey was then conducted to examine users’ awareness of the persuasive design techniques used in online gambling platforms and users’ attitudes towards the concept of explainable persuasion. Finally, an online experiment was conducted to determine the effectiveness of explainable persuasion as an inoculation intervention in building resilience against the persuasive design techniques used in online gambling platforms. The findings of the user studies showed that explainable persuasion was accepted and that it could be a promising solution for designing persuasive interfaces that promote informed choice and strengthen resilience against persuasion that is not compatible with users’ personal goals. This thesis contributes to the transparency and explainability literature as one of the first attempts to examine the role of explainability in the domain of persuasive technology, which may also have addictive potential. Identifying acceptance and rejection factors of explainable persuasion can help design persuasive interfaces that promote informed usage and meet ethical requirements. This implication applies not only to persuasive technology but can also be generalised to research areas such as combatting fake news and social engineering. The findings are expected to have important implications for gambling operators and regulators in expanding the scope of responsible gambling practices to ensure explainability and transparency. 
The results are also expected to benefit wider application areas, such as explainability in other content and interfaces related to marketing, news and recommendations made by, or facilitated by, intelligent systems.

    Combatting digital addiction: Current approaches and future directions

    In recent years, the notion of digital addiction has become popular, and calls for solutions to combat it, especially in adolescents, are on the rise. Whilst there remains debate on the status of this phenomenon as a diagnosable mental health condition, there is a need for prevention and intervention approaches that encourage individuals to have more control over their digital usage. This narrative review examines digital addiction countermeasures proposed in the last ten years. By countermeasures, we mean strategies and techniques for prevention, harm reduction, and intervention towards addictive digital behaviours. We included studies published in peer-reviewed journals between 2010 and 2021 and based on empirical evidence. In total, 87 studies were included in the review. The findings show that the main countermeasures fall under four categories: psycho-social, software-mediated, pharmacological, and combined. Overall, the proposed countermeasures were shown to be effective in reducing addictive digital use. However, a general statement on their efficacy cannot be made due to inconsistent conceptualisation of digital addiction and methodological weaknesses. Accordingly, this review highlights issues that need to be addressed in future studies.

    Explainable Persuasion in Interactive Design

    Persuasive technology refers to the use of digital means to influence attitude, behaviour, and decisions. While a professional design of persuasive interfaces should treat user interests and freedom of choice as a primary requirement, the principles and methods to achieve this are yet to be introduced. In the design of persuasive interfaces, fulfilling the conditions of informed consent can help establish transparency and resolve such ethical issues. This paper introduces the concept of explainable persuasion as a way to address informed consent within persuasive interfaces. We provide a definition of explainable persuasion, highlight the need for it, discuss the design approach and underline the challenges to be addressed when designing explainable persuasive interfaces.

    Explainable persuasion for interactive design: The case of online gambling.

    Persuasive technology refers to digital means that influence attitude, behaviour, and decisions. While the professional design of persuasive interfaces considers user interests and freedom of choice a primary requirement, the principles and methods to achieve this are yet to be introduced. In the design of persuasive interfaces, fulfilling the conditions of informed consent can help establish transparency and address such ethical issues. This paper defined explainable persuasion, its potential form and benefits, and explored whether explainable persuasion is a requirement demanded by users. It further examined explainable persuasion design from the user’s perspective and reported on acceptance and rejection factors, as well as possible design tensions and solutions. We took online gambling as a case study. A total of 250 UK-based users of gambling platforms (age range 18–75, 127 female) completed our online survey based on principles of persuasion and explainability. Findings showed that players were aware of the use, persuasive intent, and potential harm of various persuasive design techniques used in online gambling platforms (e.g., the use of in-game rewards, reminders, and praise to encourage further gambling). Despite this awareness, they agreed that explainable persuasion can still help users stay in control of their online experience, increase their positive attitude towards the online system, and remind them of the potential side effects of persuasive interfaces. Future research is required to enhance the design and implementation of explainable persuasion in persuasive interfaces.

    The Fine Line Between Persuasion and Digital Addiction

    Digital addiction is becoming a prevalent societal concern, and the persuasive design techniques used in digital platforms may also be accountable for the development and maintenance of such problematic behavior. This paper theoretically analyses the relationship between persuasive system design principles and digital addiction in light of theories on behavioral and substance-based addictions. The findings suggest that some persuasive design principles, in specific contexts, may trigger and expedite digital addiction. The purpose of this paper is to open a discussion around the potential effects of persuasive technology on digital addiction and to cater for this risk in design processes and in the persuasive design itself.

    Explainable recommendations and calibrated trust: two systematic users’ errors

    The increased adoption of collaborative human-AI decision-making tools has triggered a need to explain their recommendations for safe and effective collaboration. However, evidence from the recent literature shows that current implementations of AI explanations fail to achieve adequate trust calibration. This failure has led decision-makers either to over-trust, e.g., following incorrect recommendations, or to under-trust, rejecting correct recommendations. In this paper, we explore how users interact with explanations and why trust calibration errors occur. We take clinical decision-support systems as a case study. Our empirical investigation is based on a think-aloud protocol and observations, supported by scenarios and a decision-making exercise utilizing a set of explainable recommendation interfaces. Our study involved 16 participants from the medical domain who use clinical decision support systems frequently. Our findings showed that participants made two systematic errors while interacting with the explanations: skipping them or misapplying them in their task.

    Explainability as a Psychological Inoculation: Building Resistance to Digital Persuasion in Online Gambling through Explainable Interfaces

    Persuasive interfaces raise ethical concerns when users are unaware of persuasion or find it hard to resist. Inoculation Theory suggests that attitudes can be inoculated against persuasive attacks. Studies show that disclosure statements in native advertising help people recognize persuasive intent; just-in-time disclosure statements in persuasive interfaces may have a similar effect. In this article, explainable persuasion was used as an inoculation intervention to build resistance against persuasive interfaces. The effectiveness of this approach was assessed via a 4×2 online experiment, taking online gambling as an illustrative domain. 240 participants (age range 18–73 years, 138 male, 100 female, 2 participants chose not to disclose) were recruited from the UK. Inoculation was delivered through an animated video, while explainable persuasion was operationalized through the disclosure of persuasive intent. The findings showed that explainable persuasion increased awareness of the presence and risks of persuasive interfaces and strengthened user resistance to persuasive attempts. Explainable persuasion, being information-based, can be a cost-effective strategy for helping people stay in control of their digital usage while engaging with persuasive technologies.

    Why do we not stand up to misinformation? Factors influencing the likelihood of challenging misinformation on social media and the role of demographics

    This study investigates the barriers to challenging others who post misinformation on social media platforms. We conducted a survey amongst U.K. Facebook users (143 (57.2 %) women, 104 (41.6 %) men) to assess the extent to which the barriers to correcting others, as identified in literature across disciplines, apply to correcting misinformation on social media. We also grouped the barriers into factors and explored demographic differences amongst them. It has been suggested that users are generally hesitant to challenge misinformation, and indeed most of our participants (58.8 %) were reluctant to do so. We also identified moderating roles of age and gender in the likelihood of challenging misinformation: older people were more likely to challenge misinformation than young adults, while men demonstrated a slightly greater likelihood of challenging than women. The 20 barriers influencing the decision to challenge misinformation were then grouped into four main factors: social concerns, effort/interest considerations, prosocial intents, and content-related factors. We found that, controlling for age and gender, “social concerns” and “effort/interest considerations” had a significant impact on the likelihood to challenge. The four identified factors were then analysed in terms of demographic differences. Men ranked “effort/interest considerations” higher than women, while women placed higher importance on “content-related factors”. Moreover, older individuals were found to be more resilient to “social concerns”. The influence of educational background was most prominent in the ranking of “content-related factors”. Our findings provide important insights for the design of future interventions aimed at encouraging the challenging of misinformation on social media platforms, highlighting the need for tailored, demographically sensitive approaches.
